METHOD AND DEVICE FOR DETERMINING NAVIGATION PARAMETERS OF AN AIRCRAFT DURING A LANDING PHASE
Abstract:
The device (1) comprises a video system (10) comprising at least one digital video camera (11) arranged on the aircraft, the digital video camera (11) being configured to generate, on board the aircraft, current video data relating to at least one characteristic point on the Earth whose coordinates are known, and a data processing unit (6) comprising an extended Kalman filter (7) and configured to determine the navigation parameters from current navigation data of the aircraft originating from a satellite navigation system, from current inertial data of the aircraft, and from said video data.
Publication number: FR3018383A1
Application number: FR1451848
Filing date: 2014-03-07
Publication date: 2015-09-11
Inventors: Alain Guillet; Jeremy Vezinet; Anne-Christine Escher; Christophe Macabiau
Applicants: ECOLE NATIONALE de l'AVIATION CIVILE E N A C; Airbus Operations SAS
Description:
[0001] The present invention relates to a method and a device for determining navigation parameters of an aircraft, in particular a transport aircraft, during a landing phase, intended to provide an aid to the navigation of the aircraft. [0002] In the context of the present invention, the landing phase comprises the approach and/or the landing itself. In the usual way, a device for determining navigation parameters provides an estimation of navigation parameters of the aircraft, using baro-inertial data and GNSS data. [0003] In the context of the present invention, the following is meant: - by baro-inertial data or measurements (or INS data), parameter values determined by a unit on board the aircraft, for example an air data and inertial reference system of the ADIRS type ("Air Data and Inertial Reference System"), which combines both inertial data and barometric data; and - by GNSS or GPS data or measurements (or navigation data), parameter values provided by an onboard receiver, in particular a GPS receiver, which is associated with a global navigation satellite system, in particular of the GNSS type ("Global Navigation Satellite System"), associated with a satellite positioning system, in particular of the GPS type ("Global Positioning System"). More particularly, the navigation parameter determining device is intended to determine and provide at least some of the following navigation parameters: position, velocity and attitude parameters of the aircraft, as well as other parameters relating to sensor errors, such as inertial sensor measurement errors, the clock bias and clock drift of the GNSS receiver, and correlated GNSS pseudo-range measurement errors.
The integration of GNSS data with baro-inertial data from INS sensors ("Inertial Navigation System") is one of the main usual solutions to improve the localization of the aircraft in difficult environments, the inertial sensor measurements (INS sensors) being used in particular to fill the gaps between successive GNSS data. In the usual way, such a device generally performs an estimation of the navigation parameters using a Kalman filter in which the INS measurements and the GNSS measurements are integrated. However, when merging INS measurements and GNSS measurements, the degradation over time of the accuracy of the INS solution, namely its drift, must be taken into account. This drift depends on the quality of the INS system. However, the choice of the INS system depends on a compromise between performance and cost. [0004] Thus, the drift can vary, for example, from 1 meter per minute to several hundred meters per minute according to the INS system, and the cost also differs greatly between the various INS systems. It can therefore be advantageous to have a solution for determining navigation parameters which is both very precise and low cost. [0005] The object of the present invention is to propose such a solution. It relates to a method for determining navigation parameters of an aircraft during a landing phase, said method comprising steps implemented automatically and repetitively and consisting of: - in a first step, determining on the aircraft at least: - first data corresponding to current navigation data of the aircraft, from a satellite navigation system; and - second data corresponding to current inertial data of the aircraft; and - in a second step, calculating the navigation parameters from at least said first and second data, using an extended Kalman filter.
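The INS/GNSS fusion just described follows the generic predict/update cycle of an extended Kalman filter. The following minimal sketch illustrates that cycle only; the state layout, models and matrices are placeholders, not the specific filter defined later in this document.

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """Propagate the (error) state and its covariance one step:
    x <- F x,  P <- F P F^T + Q."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, z, h, H, R):
    """Correct the state with a measurement z, where h(x) is the
    predicted measurement and H its Jacobian evaluated at x."""
    y = z - h(x)                    # innovation
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

In a loose INS/GNSS integration, the predict step would be driven by the INS error model and the update step by the GNSS (and, in this invention, video) measurements.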
According to the invention: - the first step further comprises an operation of determining, on the aircraft, video data corresponding to current data relating to at least one characteristic point on the Earth, whose coordinates are known, said video data being generated by at least one digital video camera arranged on the aircraft and observing said characteristic point; and - the second step is configured to calculate the navigation parameters also from said video data, in addition to said first and second data. Advantageously, for the implementation of the second step, the extended Kalman filter is configured and adapted to take into account the video data in order to calculate the navigation parameters. Thus, thanks to the integration of video data, it is possible to improve the performance (precision, integrity and availability) of the navigation parameters determined using the extended Kalman filter. In particular, it is thus possible to obtain precise navigation parameters, suitable for use on the aircraft to assist navigation during the landing phase (approach and/or landing), while using for this purpose baro-inertial data obtained, for example, from a lower-cost INS system. The video camera or cameras used for the implementation of the present invention may be: - either one or more cameras already existing on the aircraft, which is particularly the case on modern commercial aircraft; - or one or more cameras dedicated to the implementation of the present invention.
Advantageously, in the first step of the method, the video data comprise the following angular measurements: a first measurement, at a focal point of the video camera, of a first angle between an axis of a reference frame of the aircraft and a projection onto a vertical plane of a line of sight of the video camera observing said characteristic point; and a second measurement, at the focal point of the video camera, of a second angle between the axis of the reference frame of the aircraft and a projection onto a horizontal plane of the line of sight of the video camera observing said characteristic point. Furthermore, advantageously, the second step of the method takes into account an observation matrix comprising a first observation matrix relating to said first data and a second observation matrix relating to said video data, and said second observation matrix includes the tangent of the first measurement and the tangent of the second measurement, which are defined with respect to the following parameters: - the latitude, the longitude and the height of the aircraft relative to the ground; and - the roll, pitch and yaw angles of the aircraft. In the context of the present invention, the characteristic point on the Earth which is used for the measurement of the video data can represent any point which can be identified by the video camera (and located on the images taken by the video camera), and whose coordinates are known. Preferably, this characteristic point corresponds to a particular point of a runway provided for a landing of the aircraft during the landing phase, and in particular the threshold of the runway. [0006] Furthermore, advantageously, said navigation parameters comprise at least some of the following data: - a position parameter of the aircraft; - a velocity parameter of the aircraft; - an attitude parameter of the aircraft; and - at least one parameter relating to an error of at least one sensor.
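The two angular measurements above can be pictured as the projections of the camera line of sight onto a vertical and a horizontal plane of the aircraft frame. The sketch below assumes a conventional body frame (x forward, y right, z down); this axis convention is an illustrative assumption, not the patent's exact camera model.

```python
import numpy as np

def optical_angles(los_body):
    """Illustrative vertical/horizontal optical angles of a line of
    sight expressed in the aircraft body frame (x forward, y right,
    z down).  ax is the angle between the x axis and the projection
    of the LOS onto the vertical (x-z) plane; ay uses the horizontal
    (x-y) plane."""
    x, y, z = los_body
    ax = np.arctan2(z, x)   # elevation-like angle, positive downwards
    ay = np.arctan2(y, x)   # azimuth-like angle, positive to the right
    return ax, ay
```

A target exactly on the forward axis yields (0, 0); a target below and ahead yields a positive first angle, consistent with a runway point seen during an approach.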
The present invention also relates to a device for determining navigation parameters of an aircraft during a landing phase. Said device, of the type comprising: - a first data generation unit, configured to determine first data corresponding to current navigation data of the aircraft, derived from a satellite navigation system; - a second data generation unit, configured to determine second data corresponding to current inertial data of the aircraft; and - a data processing unit comprising an extended Kalman filter configured to determine the navigation parameters from at least said first and second data, is remarkable, according to the invention, in that said device further comprises a video system comprising at least one digital video camera arranged on the aircraft, the digital video camera being configured to generate on the aircraft current video data relating to at least one characteristic point on the Earth, whose coordinates are known, and in that the data processing unit is configured to determine the navigation parameters also from said video data. In a particular embodiment, said device also comprises user means that exploit the navigation parameters determined by the data processing unit. [0007] The present invention also relates to a navigation system of an aircraft, which comprises the aforementioned device. The present invention further relates to an aircraft, in particular a transport aircraft, which comprises a device and/or a navigation system such as those mentioned above. The appended figures will make it clear how the invention can be realized. In these figures, identical references designate similar elements. Figure 1 is a block diagram of a particular embodiment of a device according to the invention. Figures 2 to 5 are schematic representations for explaining the relationships between different parameters and reference frames used in the implementation of the present invention. The device 1 illustrating the invention and shown schematically in FIG.
1 is intended to determine navigation parameters of an aircraft AC (FIG. 2), in particular of a transport aircraft, during a landing phase (approach and/or landing proper) of the aircraft AC on an airstrip of an airport. Said device 1, which is embedded on the aircraft AC, comprises, in the usual way: - a data generation unit 2 which comprises at least one usual receiver 3 (preferably a GPS receiver) which is associated with a global navigation satellite system, in particular of the GNSS type ("Global Navigation Satellite System"), associated with a satellite positioning system, in particular of the GPS type ("Global Positioning System"). The data generation unit 2 is configured to determine, in the usual way, first data corresponding to current navigation data (or GPS or GNSS data) of the aircraft AC (on the basis of the information received by the receiver 3 from the associated satellite navigation system); - a data generation unit 4, for example an air data and inertial reference system of the ADIRS type ("Air Data and Inertial Reference System"), which combines both inertial data and barometric data. The data generation unit 4 comprises a plurality of usual inertial sensors 5A to 5M (INS sensors), M being an integer, and is configured to determine, in the usual way, second data corresponding to inertial data (or baro-inertial or INS data) of the aircraft AC; and - a data processing unit 6 which comprises an extended Kalman filter 7 and which is connected via links 8 and 9, respectively, to said units 2 and 4. The data processing unit 6 is configured to determine the navigation parameters from at least said first and second data received from the units 2 and 4 via the links 8 and 9. According to the invention, said device 1 further comprises a video system 10 comprising at least one digital video camera 11 which is arranged on the aircraft.
[0008] The video camera 11 is arranged on the aircraft AC so as to take video images of the external environment at the front of the aircraft AC. This video camera 11 is configured to generate on the aircraft AC current video data relating to at least one characteristic point 12 on the Earth T, whose coordinates are known. To do this, the device 1 may comprise a database 14 which contains the coordinates (longitude and latitude in particular) of said characteristic point 12 and which is, for example, integrated in the data processing unit 6. This characteristic point 12 (or target point) may be any point that can be identified by the video camera 11 (and located on the video images taken by the video camera 11) and whose coordinates are known. Preferably, this characteristic point 12 corresponds to a particular point of a runway provided for a landing during the landing phase, and in particular the threshold of the runway. The video system 10 also includes a conventional video data processing unit 21 which processes the data generated by the video camera 11 and which provides the video data specified below. [0009] In addition, according to the invention, the data processing unit 6 is configured to determine the navigation parameters also from the video data generated by the video system 10 and received via a link 13, as specified below, in addition to the first and second data above. [0010] To do this, the extended Kalman filter 7 of the data processing unit 6 is configured and adapted to take into account the video data in order to calculate the navigation parameters, as specified below. Thus, thanks to the integration of the video data, the device 1 is able to improve the performance (accuracy, integrity and availability) of the navigation parameters determined using the extended Kalman filter 7.
In particular, the device 1 thus makes it possible to obtain precise navigation parameters that can be used on the aircraft AC to assist navigation during the landing phase, by using for this purpose baro-inertial data obtained, for example, from a lower-cost unit 4. The video system 10 used for the implementation of the present invention may comprise: - either one or more cameras already installed on the aircraft, which is particularly the case on modern commercial aircraft; - or one or more cameras dedicated to the implementation of the present invention and installed specifically for this purpose. In a particular embodiment, said device 1 also comprises a set 15 of onboard user means of the aircraft AC, for example cockpit display systems and/or computers for the control of the control surfaces of the aircraft AC (flight control) or for the guidance of the aircraft AC (autopilot), which use the navigation parameters determined by the data processing unit 6 (and received via a link 19). The present invention also relates to a navigation system of the aircraft AC, which comprises said device 1 and which is responsible for the navigation of the aircraft AC, in particular during the landing phase. [0011] The data processing unit 6 comprises an extended Kalman filter 7 with an error state vector which estimates the errors of the position, velocity and attitude parameters, as well as other parameters relating to sensor errors, such as the inertial sensor measurement errors (sensors 5A to 5M), the clock bias and clock drift of the GNSS receiver 3, and the correlated GNSS pseudo-range measurement errors.
In the following description of the invention, the following notations are considered:
- ω_(A/B)^A: the instantaneous rotation vector of a frame A with respect to a frame B, expressed in the coordinate system of frame A;
- Ω_(A/B)^A: the associated skew-symmetric matrix;
- R_(A2B): a rotation matrix for the transformation of vectors from the coordinate system of frame A to the coordinate system of frame B; and
- v_e^A and f_(A/B)^A: respectively the velocity of the aircraft AC relative to the Earth, expressed in the coordinate system of frame A, and the specific force vector of frame A with respect to frame B, expressed in the coordinate system of frame A.
Moreover, FIG. 2 (which very schematically shows a current position PC of the aircraft AC relative to the Earth T) illustrates some of the reference frames used in the implementation of the invention. More precisely, the reference frames involved in the implementation of the invention are the following:
A/ (X, Y, Z) is an inertial frame (or frame I). It is defined as a frame in which Newton's laws of motion apply. The origin O of the inertial frame coincides with the center of mass of the Earth T. The X axis is directed towards the vernal point ("vernal equinox"), the Z axis is directed along the axis of rotation of the Earth T, and the Y axis is defined to complete the right-handed coordinate system.
B/ (Xe, Ye, Ze) is a frame called E (or ECEF frame, for "Earth-Centered Earth-Fixed frame"). Its origin O is fixed at the center of the Earth T. The axis Ze is aligned with the axis Z of the inertial frame. The ECEF frame rotates relative to the inertial frame at the rate:
ω_ie = 7.292115×10⁻⁵ rad/s
In the ECEF frame, two coordinate systems may be used:
- the rectangular ECEF coordinate system (X, Y, Z); and
- the geodetic ECEF coordinate system, which introduces the latitude, longitude and altitude parameters (λ, φ, h).
[0012] The relationship between the two coordinate sets is:
X = (R_E + h)·cos λ·cos φ
Y = (R_E + h)·cos λ·sin φ
Z = ((1 − e²)·R_E + h)·sin λ
where e = 0.0818 is the eccentricity and R_E is the terrestrial radius.
C/ (N, E, D) is a geographic navigation frame (NED frame or frame N) which is defined locally with respect to the geoid of the Earth T. The Zn (D) axis is directed towards the inside of the ellipsoid along the normal to the ellipsoid. The Xn (N) axis points North, and the Yn (E) axis points East to complete the right-handed coordinate system. The origin of the frame is the projection of the origin of the platform onto the geoid of the Earth T. The inertial rotation rate of the Earth T, expressed in frame N, is:
ω_(i/e)^N = [ω_ie·cos λ, 0, −ω_ie·sin λ]ᵀ
D/ (Xb, Yb, Zb) is a body or mobile frame (frame B or frame M) which is rigidly attached to the vehicle in question, in this case to the aircraft AC, usually at a fixed point such as its center of gravity G (FIG. 2). This frame is also called the aircraft reference frame below. [0013] The axis Xb is defined in the direction of advance, the axis Zb is defined to point towards the bottom of the vehicle (aircraft AC), and the axis Yb is defined to point to the right of the vehicle (aircraft AC) to complete the right-handed coordinate system, as shown in FIG. 2.
E/ (Xp, Yp, Zp) is a platform frame (frame P) which is considered to be aligned with the body frame (aircraft frame); and
F/ (Xw, Yw, Zw) is an azimuth reference frame called the "wander" frame (frame W), which solves the high-latitude problem encountered by the geographic frame. Its definition is expressed in terms of the angular velocity of the frame with respect to the Earth frame.
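The geodetic-to-rectangular ECEF relationship given above can be checked numerically. The sketch below implements the text's simplified form with a single radius R_E; the numeric value of the semi-major axis is an assumed WGS-84-like constant, and a full implementation would use the latitude-dependent prime-vertical radius.

```python
import math

A_SEMI_MAJOR = 6378137.0   # semi-major axis a [m] (assumed value)
ECC = 0.0818               # eccentricity e, as quoted in the text

def geodetic_to_ecef(lat, lon, h, r_e=A_SEMI_MAJOR):
    """Geodetic (lat, lon, h) [rad, rad, m] to rectangular ECEF (X, Y, Z),
    using the simplified relationship of the text with a single radius r_e."""
    x = (r_e + h) * math.cos(lat) * math.cos(lon)
    y = (r_e + h) * math.cos(lat) * math.sin(lon)
    z = ((1.0 - ECC**2) * r_e + h) * math.sin(lat)
    return x, y, z
```

For example, a point at zero latitude, zero longitude and zero height maps to (R_E, 0, 0), i.e. onto the Xe axis of the ECEF frame.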
Thus, if the instantaneous rotation vector of frame N relative to frame E, expressed in frame N, is ω_(N/E)^N = [ρ_N, ρ_E, ρ_D]ᵀ, then the wander frame W is defined so that the vertical component of its rotation relative to the Earth frame is zero, the instantaneous rotation vector of frame W with respect to frame E, expressed in frame N, being ω_(W/E)^N = [ρ_N, ρ_E, 0]ᵀ. The data received and used by the data processing unit 6 of the device 1 are now specified. First, in the usual way, the unit 2 provides, as current navigation data, pseudo-range measurements to the data processing unit 6, via the link 8. Second, the unit 4 provides (via the link 9), as current inertial data, the following data:
- f̃_(b/i)^b = [f̃x, f̃y, f̃z]ᵀ: a measurement of the specific force of the aircraft AC relative to the inertial frame, expressed in the aircraft (mobile) frame;
- ω̃_(b/i)^b = [ω̃x, ω̃y, ω̃z]ᵀ: a measurement of the rotation rate of the aircraft AC relative to the inertial frame, expressed in the aircraft frame;
- an estimate of the latitude λ̂, the longitude φ̂ and the baro-inertial altitude ĥ_B of the aircraft AC;
- v̂^n = [v̂_N, v̂_E, v̂_D]ᵀ: an estimate of the velocity of the aircraft AC relative to the Earth, expressed in the NED frame (frame N);
- [φ̂, θ̂, ψ̂]: an estimate of the roll, pitch and yaw angles of the aircraft AC with respect to frame N;
- ŵ: an estimate of the "wander" azimuth angle, which defines the angle between the North axis Xn of frame N and the axis Xw of frame W; and
- R̂_(m2n): an estimate of the rotation from the aircraft reference frame to frame N.
[0014] Thirdly, the video system 10 detects from the aircraft AC one or more characteristic points 12 of the external environment, and in particular of the environment of the runway intended for the landing, and it provides, for each characteristic point considered, the following data, as shown in FIG. 3.
FIG. 3 illustrates a field of view 16 of the video camera 11 of the video system 10. The video data provided for each characteristic point 12 are:
- a first measurement, at a focal point 17 of the video camera 11, of a first angle ax between the axis Xb of the aircraft frame and a projection LOS1 onto a vertical plane of the line of sight LOS ("Line Of Sight") of the video camera 11 which observes said characteristic point 12 (i.e. whose field of view includes said characteristic point 12), the line of sight LOS connecting points 12 and 17; and
- a second measurement, at said focal point 17 of the video camera 11, of a second angle ay between the axis Xb of the aircraft frame and a projection LOS2 onto a horizontal plane of the line of sight LOS of the video camera 11 which observes said characteristic point 12.
The data processing unit 6 uses the preceding data to determine the navigation parameters using the extended Kalman filter 7. The treatments implemented by the extended Kalman filter 7 are presented below, first specifying the use of the baro-inertial data and the navigation data (GPS data), and then the use of the video data. In the description below:
- for an estimated parameter, the difference between the true value x of the parameter and the estimated value x̂ is noted δx; and
- for a measured parameter, the difference between the true value x of the parameter and the measured value x̃ is noted δx̃.
The radius of curvature R_M of the Earth T along a meridian 18 (FIG. 2) and the radius R_N normal to the meridian 18 are defined by:
R_M = a·(1 − e²) / (1 − e²·sin²λ)^(3/2)
R_N = a / (1 − e²·sin²λ)^(1/2)
where a is the semi-major axis of the Earth ellipsoid. The radii of curvature in the Xw and Yw directions of frame W are given by:
1/R_x = cos²w / (R_N + h_B) + sin²w / (R_M + h_B)
1/R_y = sin²w / (R_N + h_B) + cos²w / (R_M + h_B)
1/R_xy = cos w·sin w·(1/(R_N + h_B) − 1/(R_M + h_B))
Moreover, the components of the error state vector δX for the processing of the extended Kalman filter 7 (implemented by the data processing unit 6) are, successively, the following:
- dθx, dθy: the horizontal angular position errors in frame W;
- dh_B: the baro-inertial altitude error;
- da_B: the output of the compensator of the servo loop of the third-order baro-inertial loop;
- dVx, dVy: the horizontal velocity errors in frame W;
- dVz: the baro-inertial vertical velocity error (along the downward vertical axis);
- dφx, dφy, dφz: the alignment angle errors in frame W;
- bgx, bgy, bgz: the gyroscope bias errors in the aircraft frame;
- bax, bay, baz: the accelerometer bias errors in the aircraft frame;
- bH: the clock bias error of the GPS receiver (receiver 3);
- dH: the clock drift error of the GPS receiver (receiver 3); and
- errGPS,1, ..., errGPS,N: the correlated errors of the GPS pseudo-ranges (for N satellites).
First of all, the propagation matrix is specified. Firstly, the horizontal angular position error transition block F_pos is built from the terms 1/(R_x + h_B), 1/(R_y + h_B) and 1/(R_xy + h_B) defined above. Secondly, concerning the baro-inertial altitude error, the equations of the baro-inertial altitude error model are:
δḣ_B = −δV_z − K1·(δh_B − δh_baro)
δȧ_B = K3·(δh_B − δh_baro)
where K1, K2 and K3 are the gains of a third-order baro-inertial loop; the corresponding transition block is F_baro, and the complete 3D position error transition block is F_3Dpos. Thirdly, with respect to the horizontal velocity error transition block, the horizontal velocity error equation of the INS in the W coordinate system is derived from that of the navigation frame; it involves the specific force error, the gravity error vector δg, and the rotation rate errors δω_(W/E) and δω_(I/E).
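The Earth radii of curvature and their combination along the wander-frame axes can be sketched numerically as follows; the semi-major axis is an assumed WGS-84-like constant, and the wander-frame combination follows the standard form consistent with the expressions above.

```python
import math

A = 6378137.0   # semi-major axis of the Earth ellipsoid [m] (assumed)
E = 0.0818      # eccentricity

def curvature_radii(lat):
    """Meridian radius R_M and normal (prime-vertical) radius R_N."""
    s2 = math.sin(lat) ** 2
    r_m = A * (1.0 - E**2) / (1.0 - E**2 * s2) ** 1.5
    r_n = A / math.sqrt(1.0 - E**2 * s2)
    return r_m, r_n

def wander_inverse_radii(lat, wander, h_b):
    """Inverse radii of curvature along the Xw and Yw axes of the
    wander frame, plus the cross term 1/R_xy."""
    r_m, r_n = curvature_radii(lat)
    cw, sw = math.cos(wander), math.sin(wander)
    inv_rx = cw**2 / (r_n + h_b) + sw**2 / (r_m + h_b)
    inv_ry = sw**2 / (r_n + h_b) + cw**2 / (r_m + h_b)
    inv_rxy = cw * sw * (1.0 / (r_n + h_b) - 1.0 / (r_m + h_b))
    return inv_rx, inv_ry, inv_rxy
```

For a zero wander angle the frame W coincides with the NED frame, so 1/R_x reduces to 1/(R_N + h_B), 1/R_y to 1/(R_M + h_B), and the cross term vanishes.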
The gravity model error δg' is not modeled in the state vector; these errors are embedded in the velocity state noise vector and, for the horizontal velocity error equation, the first two components of the gravity error vector are negligible:
δg^w ≈ [0, 0, −2·g0·δh_B/R_E]ᵀ
where g0 is the gravitational constant. The computation of δω_(W/E)^w and δω_(I/E)^w is done using the expressions of the partial derivatives with respect to the position parameters (θx, θy, h) and the velocity v^w. The rotation of frame W with respect to frame E and the rotation of frame E with respect to frame I, expressed in frame W, both involve the wander-angle rotation matrix:
[ cos w   sin w   0 ]
[ −sin w  cos w   0 ]
[ 0       0       1 ]
The dynamic matrix F_velo of the horizontal velocity error is composed of the blocks F_velo/pos, F_velo/baro = 0, F_velo/velo, F_velo/att, F_velo/bg = 0 and F_velo/ba = R_m2w; the final horizontal velocity error dynamic matrix is composed of the first two rows of F_velo. Fourth, with respect to the baro-inertial vertical velocity error transition block, the equation of the baro-inertial vertical velocity error model is:
δV̇_z = δf_z − 2·g0·δh_B/R_E + δa_B + K2·(δh_B − δh_baro)
with δf_z the error on the specific vertical force of the aircraft AC with inertial corrections. The complete 3D velocity error transition matrix F_3Dvelo stacks the horizontal velocity block and the baro-inertial vertical velocity block. Fifth, concerning the attitude error transition block, the attitude (alignment) error equation of the INS in the W coordinate system involves the gyroscope measurement errors, transported by R_m2w, and the rotation rate errors δω_(W/I)^w; the attitude dynamic matrix is written:
F_att = [F_att/pos  F_att/velo  F_att/att  F_att/bg  F_att/ba]
with F_att/bg = R_m2w, and the INS dynamic matrix F_INS gathers the position, velocity and attitude blocks. Sixth, the discrete transition matrix F_biases of the sensor measurement errors models the gyroscope and accelerometer biases as exponentially correlated processes, with diagonal terms e^(−Δt/τ_gyro) and e^(−Δt/τ_accel). Seventh, concerning the drift and the bias of the clock of the GPS receiver (receiver 3), the discrete transition of the clock bias and clock drift is:
b_H(k+1) = b_H(k) + Δt·d_H(k) + n_b
d_H(k+1) = d_H(k) + n_d
so that the clock transition matrix is:
F_clock = [ 1  Δt ]
          [ 0  1  ]
Eighth, regarding the correlated errors of the GPS pseudo-ranges, the discrete transition matrix F_errGPS of the correlated pseudo-range errors is diagonal, with terms e^(−Δt/τ_errGPS). The final GPS dynamic matrix is then:
F_GPS = diag(F_clock, F_errGPS)
and the global dynamic matrix is:
F = diag(F_INS/IMU, F_GPS)
We now define the state noise covariance matrix relating to the extended Kalman filter 7.
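The two discrete GPS-side transition blocks, i.e. the constant-drift clock model and the exponentially correlated (first-order Gauss-Markov) pseudo-range errors, can be sketched as follows; the step and correlation time used in the example are placeholder values.

```python
import numpy as np

def clock_transition(dt):
    """Clock bias b_H driven by clock drift d_H:
    b_H(k+1) = b_H(k) + dt * d_H(k),  d_H(k+1) = d_H(k)."""
    return np.array([[1.0, dt],
                     [0.0, 1.0]])

def gauss_markov_transition(dt, tau, n_sat):
    """Discrete transition of n_sat exponentially correlated
    pseudo-range errors with correlation time tau:
    err(k+1) = exp(-dt / tau) * err(k)."""
    return np.exp(-dt / tau) * np.eye(n_sat)
```

Stacking these two blocks diagonally yields the F_GPS matrix above; stacking F_GPS with the INS/IMU block yields the global dynamic matrix F.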
First, the INS/IMU state noise covariance matrix Q_INS/IMU is diagonal, with terms q_θx, q_θy, q_hB, q_Vx, q_Vy, q_Vz, q_φx, q_φy, q_φz, q_bgx, q_bgy, q_bgz, q_bax, q_bay, q_baz. The diagonal components reflect the dynamics of the evolution of the inertial error state, as adapted to the inertial sensor classes. Secondly, the GPS state noise covariance matrix Q_GPS is diagonal, with terms σ²_bH, σ²_dH, σ²_errGPS,1, ..., σ²_errGPS,N; its diagonal components reflect the dynamics of the evolution of the GPS error state, as adapted to the receiver type. The global state noise covariance matrix is thus:
Q = diag(Q_INS/IMU, Q_GPS)
We now define the observation matrix relative to the extended Kalman filter 7. Concerning the observation matrix, firstly, the GPS pseudo-range observation function is:
h_GPS^i(x(k)) = ρ_i(k) + c·b_H(k)
ρ_i(k) = sqrt((x(k) − X_i(k))² + (y(k) − Y_i(k))² + (z(k) − Z_i(k))²)
with:
- (x, y, z) the coordinates of the position estimated by the INS sensors (sensors 5A to 5M) in frame E; and
- (X_i, Y_i, Z_i) the coordinates of the position of the i-th satellite in frame E.
[0015] Each row of the linear observation matrix H_GPS is then obtained by the chain rule, as the product of the partial derivatives of ρ_i with respect to the ECEF position (X, Y, Z), whose components form the unit line-of-sight vector, and of the partial derivatives of the ECEF position with respect to the state parameters (θx, θy, h_B), which involve the terms (R_E + h_B) and the sines and cosines of λ and φ, together with the wander rotation mapping (θx, θy) to (λ, φ); the row is completed by a 1 on the clock bias state and a 1 on the corresponding correlated pseudo-range error state.
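The pseudo-range observation function and its position Jacobian (the unit line-of-sight vector, here expressed directly in the E frame) can be sketched as:

```python
import numpy as np

def predicted_pseudorange(pos_ecef, sat_ecef, clock_bias, c=299792458.0):
    """h_GPS(x) = geometric range to the satellite + c * clock bias [m]."""
    rho = np.linalg.norm(sat_ecef - pos_ecef)
    return rho + c * clock_bias

def pseudorange_jacobian_position(pos_ecef, sat_ecef):
    """Partial derivative of the pseudo-range with respect to the
    receiver ECEF position: minus the unit line-of-sight vector."""
    d = sat_ecef - pos_ecef
    return -d / np.linalg.norm(d)
```

In the filter itself, this ECEF-frame Jacobian would still be chained with the partial derivatives of the ECEF position with respect to the (θx, θy, h_B) states, as described above.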
In addition, secondly, concerning the observation matrix of the video measurements generated by the video system 10, the observation function of the optical angular measurements is:
g_video(x(k)) = [tan(ax), tan(ay)]ᵀ
The optical angle measurements are defined with respect to the roll angle φ and two intermediate functions f_x and f_y:
tan(ax) = f_x(λ, h_B, θ)·cos φ + f_y(λ, h_B, ψ)·sin φ
tan(ay) = −f_x(λ, h_B, θ)·sin φ + f_y(λ, h_B, ψ)·cos φ
For the vertical channel, f_x is obtained from the pitch angle θ and the depression angle α_D of the characteristic point 12:
f_x = tan(θ + π/2 − α_D) = (1 + tan α_D·tan θ) / (tan α_D − tan θ)
with:
L = R_T·sin(GCA)
ΔAlt = R_T·(1 − cos(GCA))
tan α_D = g_x(λ, φ, h_B) = R_T·sin(GCA) / (R_E + h_B − R_T·cos(GCA))
where R_T is the distance from the center of the Earth T to the characteristic point 12 (target). For the previous treatments, reference is made to FIG. 4, which illustrates the observation model for the vertical optical angular measurements; this figure shows in particular the horizontal plane through the center of mass G of the aircraft AC and the ellipsoid. It is known that the haversine of an angle is defined by:
haversin(x) = sin²(x/2) = (1 − cos x)/2
The haversine of the great circle angle (GCA) between the aircraft and the characteristic point, for latitude and longitude differences Δλ and Δφ, reduces here to:
haversin(GCA) = (1 − cos Δλ·cos Δφ)/2
so that the great circle angle is:
GCA = 2·arcsin(sqrt((1 − cos Δλ·cos Δφ)/2)) = arccos(cos Δλ·cos Δφ)
For the horizontal channel, f_y is obtained from the yaw angle ψ and the azimuth angle α_N of the characteristic point 12:
f_y = tan(α_N − ψ) = (tan α_N − tan ψ) / (1 + tan α_N·tan ψ)
d_lat = 2·R_T·sin(Δλ/2)
d_long = 2·R_T·sin(Δφ/2)
tan α_N = g_y(λ, φ), g_y being built from the East offset d_long and the North offset d_lat of the characteristic point. Reference is made for the previous treatments to FIG. 5, which illustrates the observation model for the horizontal optical angular measurements; this figure shows in particular a construction in the East (E) and North (N) axes. The functions which describe the observation function are thus of the form:
h1(x, y, z) = x·cos z + y·sin z
h2(x, y, z) = −x·sin z + y·cos z
f(x, y) = (x − y) / (1 + x·y) or f(x, y) = (1 + x·y) / (x − y)
g_x(x, y) = R_T·sin x / (R_E + y − R_T·cos x)
k(x, y) = arccos(cos(x − λT)·cos(y − φT)) = 2·arcsin(sqrt((1 − cos(x − λT)·cos(y − φT))/2))
where (λT, φT) are the coordinates of the characteristic point 12. The linearized observation matrix H_video is obtained by the chain rule, differentiating tan(ax) and tan(ay) with respect to the state parameters (θx, θy, h_B) and the attitude errors, through the partial derivatives of f_x and f_y, for example:
∂f_x/∂g_x = (1 + tan²θ) / (tan θ − g_x)²
∂f_x/∂tan θ = (1 + g_x²) / (tan θ − g_x)², with ∂tan θ/∂θ = 1 + tan²θ = 1/cos²θ
∂f_y/∂g_y = (1 + tan²ψ) / (1 + tan ψ·g_y)²
∂f_y/∂tan ψ obtained likewise, with ∂tan ψ/∂ψ = 1 + tan²ψ = 1/cos²ψ
through the partial derivatives of g_x and g_y with respect to λ, φ and h_B, and through the partial derivatives of the great circle angle k with respect to λ and φ:
∂k/∂λ = cos Δφ·sin Δλ / sin(k)
∂k/∂φ = cos Δλ·sin Δφ / sin(k)
the transformation from (λ, φ) to the frame W coordinates (θx, θy) involving the wander rotation:
∂(λ, φ)/∂(θx, θy) = [ cos w  −sin w ]
                    [ sin w   cos w ]
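The haversine and great-circle-angle relations used by the video observation model can be checked numerically; the sketch below implements the simplified form given above, which is adequate for the small latitude/longitude separations involved in an approach.

```python
import math

def haversine(x):
    """haversin(x) = sin^2(x/2) = (1 - cos(x)) / 2."""
    return math.sin(x / 2.0) ** 2

def great_circle_angle(dlat, dlon):
    """Great circle angle between two points separated by (dlat, dlon),
    per the simplified relation GCA = arccos(cos(dlat) * cos(dlon))."""
    return math.acos(math.cos(dlat) * math.cos(dlon))
```

With a zero longitude difference the great circle angle reduces to the latitude difference, and the two expressions of the GCA (arccos form and 2·arcsin(sqrt(haversine)) form) agree.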
Finally, the partial derivatives of (h1, h2) with respect to their first two arguments form the roll rotation:

∂(h1, h2)/∂(x, y) = [cos(φ), sin(φ) ; −sin(φ), cos(φ)]

From the above data, the overall observation matrix H is obtained, which corresponds, in the context of the present invention, to:

H = [H_GPS ; H_video]

Moreover, concerning the measurement noise covariance matrix:
- the covariance matrix R_GPS of the GPS pseudo-range measurement noise is: R_GPS = σ²_ERE · I_(nb pseudoranges);
- the covariance matrix R_video of the video measurement noise is: R_video = σ²_video · I_(2 × nb targets);
- and finally, the global noise covariance matrix R is the block-diagonal matrix: R = [R_GPS, 0 ; 0, R_video].

The device 1, as described above, integrates video data, which makes it possible to improve the accuracy of the navigation parameters determined using the extended Kalman filter 7 (appropriately adapted, as specified above) of the data processing unit 6 of the device 1. The device 1 thus provides an aid to the navigation of the aircraft AC, in particular during the approach and/or the landing, being for example integrated into a navigation system of the aircraft AC. In addition, the integration of the additional visual measurements (video data), derived from video processing of at least one digital video camera 11, makes it possible to obtain a device 1 and/or a navigation system which are autonomous (and which do not require, in particular, ground stations).
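The vertical and horizontal observation channels described above (great circle angle GCA, depression angle αD, bearing angle αN) can be checked numerically. The following is a minimal spherical-Earth sketch; the constants and function names are illustrative assumptions, not the patented implementation:

```python
import math

R_T = R_E = 6_371_000.0  # spherical Earth radius in metres (illustrative value)

def haversin(x):
    # half the versed sine: sin^2(x/2) = (1 - cos(x)) / 2
    return math.sin(x / 2.0) ** 2

def great_circle_angle(d_lon, d_lat):
    # GCA = arccos(cos(d_lon) * cos(d_lat)), computed through the haversine form
    return 2.0 * math.asin(math.sqrt((1.0 - math.cos(d_lon) * math.cos(d_lat)) / 2.0))

def tan_alpha_D(d_lon, d_lat, h_B):
    # vertical channel: tan(alpha_D) = R_T sin(GCA) / (R_E + h_B - R_T cos(GCA))
    gca = great_circle_angle(d_lon, d_lat)
    return R_T * math.sin(gca) / (R_E + h_B - R_T * math.cos(gca))

def tan_alpha_N(d_lon, d_lat):
    # horizontal channel: tan(alpha_N) = sin(d_lon/2) / sin(d_lat/2), for d_lat != 0
    return math.sin(d_lon / 2.0) / math.sin(d_lat / 2.0)
```

For example, an aircraft 1000 m above the reference ellipsoid and offset from the characteristic point by 10⁻⁴ rad of longitude (about 637 m of ground range) obtains tan(αD) ≈ 0.64, and an aircraft directly above the point obtains tan(αD) = 0, consistent with αD being measured from the local vertical.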
Claims (10) [0001] 1. A method for determining navigation parameters of an aircraft during a landing phase, said method comprising steps implemented automatically and repetitively and consisting of: - in a first step, determining on the aircraft (AC): - first data corresponding to current navigation data of the aircraft (AC), originating from a satellite navigation system; and - second data corresponding to current inertial data of the aircraft (AC); and - in a second step, calculating the navigation parameters from at least said first and second data, using an extended Kalman filter (7), characterized in that: - the first step comprises, in addition, an operation of determining, on the aircraft (AC), video data corresponding to current data relating to at least one characteristic point (12) on the Earth (T), whose coordinates are known, said video data being generated by at least one digital video camera (11) arranged on the aircraft (AC) and observing said characteristic point (12); and - the second step is configured to calculate the navigation parameters also from said video data, in addition to said first and second data. [0002] 2. Method according to claim 1, characterized in that, for the implementation of the second step, the extended Kalman filter (7) is configured and adapted to take into account the video data in order to calculate the navigation parameters. [0003] 3.
Method according to one of claims 1 and 2, characterized in that, in the first step, the video data comprise the following angular measurements: - a first measurement, at a focal point (PF) of the video camera (11), of a first angle (αx) between an axis (Xb) of a reference frame of the aircraft and a projection (LOS1), on a vertical plane, of a line of sight (LOS) of the video camera (11) aimed toward said characteristic point (12); and - a second measurement, at the focal point (PF) of the video camera (11), of a second angle (αy) between the axis (Xb) of the reference frame of the aircraft and a projection (LOS2), on a horizontal plane, of the line of sight (LOS) of the video camera (11) aimed toward said characteristic point (12). [0004] 4. Method according to claim 3, characterized in that the second step takes into account an observation matrix comprising a first observation matrix relating to said first data and a second observation matrix relating to said video data, and in that said second observation matrix comprises the tangent of the first measurement and the tangent of the second measurement, which are defined with respect to the following parameters: - the latitude, the longitude and the height above the ground of the aircraft (AC); and - the roll, pitch and yaw angles of the aircraft (AC). [0005] 5. Method according to one of claims 3 and 4, characterized in that said characteristic point (12) on the Earth (T) represents a particular point of a runway provided for a landing of the aircraft (AC) during the landing phase. [0006] 6. Method according to any one of the preceding claims, characterized in that said navigation parameters comprise at least some of the following data: - a position parameter of the aircraft (AC); - a velocity parameter of the aircraft (AC); - an attitude parameter of the aircraft (AC); and - at least one parameter relating to an error of at least one sensor. [0007] 7.
Device for determining navigation parameters of an aircraft during a landing phase, said device (1) comprising: - a first data generating unit (2) configured to determine first data corresponding to current navigation data of the aircraft (AC), derived from a satellite navigation system; - a second data generating unit (4) configured to determine second data corresponding to current inertial data of the aircraft (AC); and - a data processing unit (6) comprising an extended Kalman filter (7) and configured to determine the navigation parameters from at least said first and second data, characterized in that said device (1) further comprises a video system (10) comprising at least one digital video camera (11) arranged on the aircraft (AC), the digital video camera (11) being configured to generate on the aircraft (AC) current video data relating to at least one characteristic point (12) on the Earth (T), whose coordinates are known, and in that the data processing unit (6) is configured to determine the navigation parameters also from said video data. [0008] 8. Device according to claim 7, characterized in that it comprises means (15) for using the navigation parameters determined by the data processing unit (6). [0009] 9. Navigation system of an aircraft, characterized in that it comprises a device (1) such as that specified in one of claims 7 and 8. [0010] 10. Aircraft, characterized in that it comprises a device (1) such as that specified in one of claims 7 and 8.
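Claims 1, 2 and 4 describe an extended Kalman filter that corrects an inertial prediction with stacked GPS pseudo-range and video angular measurements. Because the global noise covariance described in the description is diagonal (σ²_ERE on the pseudo-ranges, σ²_video on the video tangents), the stacked correction with H = [H_GPS ; H_video] can equivalently be applied one scalar measurement at a time. The sketch below is a generic, self-contained illustration of that correction step (plain Python lists, hypothetical variable names), not the code of the claimed device; in a true EKF the predicted measurement would be the nonlinear observation g(x) and H_row its Jacobian row:

```python
def ekf_sequential_update(x, P, measurements):
    """Sequential scalar Kalman corrections, equivalent to the batch update
    with H = [H_gps; H_video] when the measurement noise covariance R is
    diagonal (independent pseudo-range and video measurement noises).
    x : state vector (list), P : covariance matrix (list of lists),
    measurements : list of (z, H_row, r) scalar measurements."""
    n = len(x)
    for z, H_row, r in measurements:
        # predicted scalar measurement (linearized observation H_row . x)
        h_x = sum(H_row[i] * x[i] for i in range(n))
        # P @ H_row^T (column) and innovation variance s = H P H^T + r
        PHt = [sum(P[i][j] * H_row[j] for j in range(n)) for i in range(n)]
        s = sum(H_row[i] * PHt[i] for i in range(n)) + r
        K = [PHt[i] / s for i in range(n)]                 # Kalman gain column
        x = [x[i] + K[i] * (z - h_x) for i in range(n)]    # state correction
        # covariance correction P <- (I - K H) P = P - K (P H^T)^T (P symmetric)
        P = [[P[i][j] - K[i] * PHt[j] for j in range(n)] for i in range(n)]
    return x, P
```

Sequential scalar processing avoids the matrix inversion of the batch update and yields the identical result when the measurement noises are mutually independent, which is the case for the diagonal R_GPS and R_video blocks above.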
Similar technologies:
Publication number | Publication date | Patent title
FR3018383A1 | 2015-09-11 | METHOD AND DEVICE FOR DETERMINING NAVIGATION PARAMETERS OF AN AIRCRAFT DURING A LANDING PHASE
EP2353024B1 | 2016-08-31 | Method for geolocating an object by multitelemetry
Wang et al. 2008 | Integration of GPS/INS/vision sensors to navigate unmanned aerial vehicles
EP2513668B1 | 2018-01-24 | Method for geo-referencing an imaged area
Mostafa et al. 2001 | Direct positioning and orientation systems: How do they work? What is the attainable accuracy
Schwarz et al. 2004 | Mobile mapping systems – state of the art and future trends
US10935381B2 | 2021-03-02 | Star tracker-aided airborne or spacecraft terrestrial landmark navigation system
EP1724592A1 | 2006-11-22 | System for estimating the speed of an aircraft and its application to the detection of obstacles
FR2954494A1 | 2011-06-24 | METHOD OF CALIBRATING A MEASURING INSTRUMENT OF AN OPTRONIC SYSTEM
US9383210B2 | 2016-07-05 | Image navigation and registration transfer from exquisite systems to hosted space payloads
US20180080787A1 | 2018-03-22 | Star Tracker-Aided Airborne or Spacecraft Terrestrial Landmark Navigation System
Mostafa et al. 2001 | GPS/IMU products – the Applanix approach
Vetrella et al. 2015 | Cooperative UAV navigation based on distributed multi-antenna GNSS, vision, and MEMS sensors
US9243914B2 | 2016-01-26 | Correction of navigation position estimate based on the geometry of passively measured and estimated bearings to near earth objects
Wierzbicki et al. 2020 | Determining the elements of exterior orientation in aerial triangulation processing using UAV technology
Schwarz et al. 2007 | Digital mobile mapping systems: State of the art and future trends
FR3064350A1 | 2018-09-28 | METHOD FOR CALCULATING A SPEED OF AN AIRCRAFT, METHOD FOR CALCULATING A PROTECTIVE RADIUS, POSITIONING SYSTEM AND ASSOCIATED AIRCRAFT
Amami 2022 | The Advantages and Limitations of Low-Cost Single Frequency GPS/MEMS-Based INS Integration
Petrovska et al. 2013 | Aircraft precision landing using integrated GPS/INS system
Vezinet et al. 2014 | Video integration in a GPS/INS hybridization architecture for approach and landing
Pu et al. 2018 | Astronomical vessel heading determination based on simultaneously imaging the moon and the horizon
FR3052858A1 | 2017-12-22 | METHOD OF ESTIMATING AN ABSOLUTE ORIENTATION DIRECTION OF AN OPTRONIC SYSTEM
Shen 2012 | Nonlinear modeling of inertial errors by fast orthogonal search algorithm for low cost vehicular navigation
FR3071624B1 | 2019-10-11 | DISPLAY SYSTEM, DISPLAY METHOD, AND COMPUTER PROGRAM
Aboutaleb 2020 | Multi-sensor based land vehicles' positioning in challenging GNSS environments
Patent family:
Publication number | Publication date
FR3018383B1 | 2017-09-08
US20150253150A1 | 2015-09-10
US9593963B2 | 2017-03-14
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US6157876A | 1999-10-12 | 2000-12-05 | Honeywell International Inc. | Method and apparatus for navigating an aircraft from an image of the runway
EP1335258A1 | 2002-01-25 | 2003-08-13 | Airbus France | Method for guiding an aircraft in a final landing phase and corresponding device
US20050125142A1 | 2003-10-07 | 2005-06-09 | Akihiro Yamane | Navigation apparatus and navigation method with image recognition
US6405975B1 | 1995-12-19 | 2002-06-18 | The Boeing Company | Airplane ground maneuvering camera system
US6028624A | 1997-12-11 | 2000-02-22 | The United States Of America As Represented By The Secretary Of The Army | Method and apparatus for increased visibility through fog and other aerosols
FR2896071A1 | 2006-01-11 | 2007-07-13 | Airbus France Sas | METHOD AND DEVICE FOR AIDING THE CONTROL OF AN AIRCRAFT DURING AN AUTONOMOUS APPROACH
FR2897840B1 | 2006-02-27 | 2009-02-13 | Eurocopter France | METHOD AND DEVICE FOR PROCESSING AND VISUALIZING PILOTAGE INFORMATION OF AN AIRCRAFT
FR2928021B1 | 2008-02-25 | 2011-06-10 | Airbus France | METHOD AND DEVICE FOR DETECTION OF A SURROUNDING AIRCRAFT
US8284997B2 | 2009-03-11 | 2012-10-09 | Honeywell International Inc. | Vision-based vehicle navigation system and method
US8406466B2 | 2009-12-14 | 2013-03-26 | Honeywell International Inc. | Converting aircraft enhanced vision system video to simulated real time video
US20110282580A1 | 2010-05-11 | 2011-11-17 | Honeywell International Inc. | Method of image based navigation for precision guidance and landing
FR2961897B1 | 2010-06-25 | 2012-07-13 | Thales Sa | NAVIGATION FILTER FOR A FIELD CORRELATION NAVIGATION SYSTEM
US8600589B2 | 2012-04-24 | 2013-12-03 | Exelis, Inc. | Point cloud visualization of acceptable helicopter landing zones based on 4D LIDAR
CN104006790A | 2013-02-21 | 2014-08-27 | 成都海存艾匹科技有限公司 | Vision-Based Aircraft Landing Aid
FR2998958B1 | 2012-12-05 | 2019-10-18 | Thales | METHOD FOR MANAGING AIR DATA OF AN AIRCRAFT
FR3000196B1 | 2012-12-21 | 2015-02-06 | Airbus Operations Sas | DEVICE FOR PROVIDING NAVIGATION PARAMETER VALUES OF A VEHICLE
US9435661B2 | 2014-10-08 | 2016-09-06 | Honeywell International Inc. | Systems and methods for attitude fault detection based on air data and aircraft control settings
US9593962B2 | 2014-10-08 | 2017-03-14 | Honeywell International Inc. | Systems and methods for attitude fault detection based on integrated GNSS/inertial hybrid filter residuals
US9832085B1 | 2015-01-20 | 2017-11-28 | Mehdi Malboubi | System for estimating unknown attributes of interest in the under-determined inverse problem and a process of accomplishing the same
KR101784955B1 | 2016-05-04 | 2017-10-12 | LIG Nex1 Co., Ltd. | Method for calculating flight speed using image information
CN109983307A | 2016-09-22 | 2019-07-05 | The Regents of the University of California | SDR for navigation by LTE signals
DE102016222272B4 | 2016-11-14 | 2018-05-30 | Volkswagen Aktiengesellschaft | Estimating one's own position
CN109269511B | 2018-11-06 | 2020-01-07 | Beijing Institute of Technology | Curve matching visual navigation method for planet landing in unknown environment
CN109375647A | 2018-11-20 | 2019-02-22 | Xi'an Aeronautics Computing Technique Research Institute, AVIC | Miniature multi-source perceptual computing system
FR3097045B1 | 2019-06-06 | 2021-05-14 | Safran Electronics & Defense | Method and device for resetting an inertial unit of a means of transport from information delivered by a viewfinder of the means of transport
CN111912401B | 2020-06-30 | 2021-08-03 | Chengdu Aircraft Industrial (Group) Co., Ltd. | Working space solving method for airplane large-part attitude adjusting mechanism
Legal status:
2015-03-31 | PLFP | Fee payment | Year of fee payment: 2
2016-03-31 | PLFP | Fee payment | Year of fee payment: 3
2017-03-31 | PLFP | Fee payment | Year of fee payment: 4
2018-03-29 | PLFP | Fee payment | Year of fee payment: 5
2019-03-29 | PLFP | Fee payment | Year of fee payment: 6
2020-03-31 | PLFP | Fee payment | Year of fee payment: 7
2021-03-30 | PLFP | Fee payment | Year of fee payment: 8
Priority:
Application number | Filing date | Patent title
FR1451848A | 2014-03-07 | METHOD AND DEVICE FOR DETERMINING NAVIGATION PARAMETERS OF AN AIRCRAFT DURING A LANDING PHASE
FR3018383B1 (priority FR1451848A) | 2014-03-07 | METHOD AND DEVICE FOR DETERMINING NAVIGATION PARAMETERS OF AN AIRCRAFT DURING A LANDING PHASE
US14/637,554 (US9593963B2) | 2015-03-04 | Method and a device for determining navigation parameters of an aircraft during a landing phase
|